Direct l_(2, p)-Norm Learning for Feature Selection
Authors
Abstract
In this paper, we propose a novel sparse learning based feature selection method that directly optimizes a large margin linear classification model's sparsity with the l_{2,p}-norm (0 < p ≤ 1) subject to data-fitting constraints, rather than using the sparsity as a regularization term. To solve the direct sparsity optimization problem, which is non-smooth and non-convex when 0 < p < 1, we provide an efficient iterative algorithm with proven convergence by converting it to a convex and smooth optimization problem at every iteration step. The proposed algorithm has been evaluated on publicly available datasets, and extensive comparison experiments have demonstrated that it achieves feature selection performance competitive with state-of-the-art algorithms.
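For concreteness, the mixed l_{2,p} matrix norm that the abstract refers to is the l_p norm of the vector of row-wise l_2 norms, ||W||_{2,p} = (Σ_i ||w_i||_2^p)^{1/p}. A minimal NumPy sketch of this quantity follows; the function name and example matrix are ours, not the paper's:

```python
import numpy as np

def l2p_norm(W, p):
    """Mixed l_{2,p} norm of a matrix W: the l_p norm of the
    vector of row-wise l_2 norms, (sum_i ||w_i||_2^p)^(1/p)."""
    row_norms = np.linalg.norm(W, axis=1)       # l_2 norm of each row
    return (row_norms ** p).sum() ** (1.0 / p)

W = np.array([[3.0, 4.0],
              [0.0, 0.0],
              [6.0, 8.0]])
print(l2p_norm(W, 1.0))   # row norms are 5, 0, 10 -> 15.0
print(l2p_norm(W, 0.5))   # smaller p pushes the penalty toward counting nonzero rows
```

Smaller values of p make the penalty a closer surrogate for counting nonzero rows, which is why the problem becomes non-convex (and harder to optimize) for 0 < p < 1.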
Similar resources
$l_{2,p}$ Matrix Norm and Its Application in Feature Selection
Recently, the l_{2,1} matrix norm has been widely applied in many areas such as computer vision, pattern recognition, and biological study. As an extension of the l_1 vector norm, the mixed l_{2,1} matrix norm is often used to find jointly sparse solutions. Moreover, an efficient iterative algorithm has been designed to solve l_{2,1}-norm involved minimizations. Actually, computational studies have shown th...
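Joint sparsity here means entire rows of the weight matrix are driven to zero, so a feature is kept or dropped for all outputs at once. A hedged sketch of how such a matrix is typically turned into a feature ranking (the function name and example matrix are hypothetical, not from the cited work):

```python
import numpy as np

def top_k_features(W, k):
    """Score each feature by the l_2 norm of its row in the weight
    matrix W (features x outputs); jointly sparse rows score ~0."""
    scores = np.linalg.norm(W, axis=1)
    return np.argsort(-scores)[:k]   # indices of the k largest scores

# Hypothetical weight matrix: rows 1 and 3 carry most of the weight.
W = np.array([[0.1, 0.0],
              [5.0, 5.0],
              [0.0, 0.0],
              [2.0, 1.0]])
print(top_k_features(W, 2))   # -> [1 3]
```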
Feature Selection at the Discrete Limit
Feature selection plays an important role in many machine learning and data mining applications. In this paper, we propose to use the L_{2,p} norm for feature selection with emphasis on small p. As p → 0, the problem becomes a discrete feature selection problem. We provide two algorithms, a proximal gradient algorithm and a rank-one update algorithm, the latter being more efficient at large regularization λ. W...
Non-Convex Feature Learning via l_{p,∞} Operator
We present a feature selection method for solving a sparse regularization problem, which has a composite regularization of the l_p norm and the l_∞ norm. We use a proximal gradient method to solve this l_{p,∞} operator problem, where a simple but efficient algorithm is designed to minimize a relatively simple objective function, which contains a vector of l_2 norm and l_∞ norm. The proposed method brings some insight...
Uncorrelated Group LASSO
The l_{2,1}-norm is an effective regularization to enforce a simple group sparsity for feature learning. To capture some subtle structures among feature groups, we propose a new regularization called the exclusive group l_{2,1}-norm. It enforces sparsity at the intra-group level by using the l_{2,1}-norm, while encouraging the selected features to distribute in different groups by using the l_2 norm at the inter-grou...
Sequential and Mixed Genetic Algorithm and Learning Automata (SGALA, MGALA) for Feature Selection in QSAR
Feature selection is of great importance in Quantitative Structure-Activity Relationship (QSAR) analysis. This problem has been solved using meta-heuristic algorithms such as GA, PSO, ACO, and SA. In this work, two novel hybrid meta-heuristic algorithms, Sequential GA and LA (SGALA) and Mixed GA and LA (MGALA), which are based on genetic algorithms and learning automata, are proposed for QSAR f...
Journal: CoRR
Volume: abs/1504.00430
Pages: -
Publication year: 2015